Machine cosmology: investigating the dark sector through novel inference methods
Over the last few decades, cosmology has experienced an influx of new theory
and observations, driven by the ever-increasing capabilities of current and
upcoming large-scale surveys, by growing computational and methodological
resources, and by new theoretical work fueled by both. Observational
measurements often carry uncertainties from noise or random processes, and
inference methods address the resulting inverse problem of exploring the
underlying distributions of the data. Over the same time frame, Bayesian statistics
has thus quickly found itself in a central role in cosmological analysis, as the
field is rife with inverse problems such as hypothesis testing, model selection, and
parameter estimation. More recently, inference models from the field of machine
learning have also experienced a surge of applications in cosmology. We explore
the utility of such inference methods for challenges in cosmology at different
degrees of granularity, focusing on the dark sector of our Universe and traveling
from the largest scales to more local problems in the process.
Starting in the area of cosmological parameter estimation, we develop a novel
parallel-iterative parameter estimation method rooted in Bayesian nonparametrics and recent developments in variational inference from the field of
machine learning in Chapter 2. In doing so, we propose, implement, and
test a new approach to fast high-dimensional parameter estimation in an
embarrassingly parallel manner. For this work, we make use of large-scale
supercomputing facilities to speed up the functional extraction of cosmological
parameter posteriors based on data from the Dark Energy Survey. Next, we
concentrate on the dark energy equation of state in Chapter 3, stress-testing its
imprint on type Ia supernova measurements through a newly introduced random
curve generator for smooth function perturbations. We then investigate the robustness
of standard model analyses based on such data with regard to deviations from a
cosmological constant in the form of a redshift-dependent equation of state.
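The thesis does not specify its curve generator here, but the idea of smooth random perturbations around a cosmological-constant equation of state w(z) = -1 can be sketched with a Gaussian process draw. In this illustrative sketch, the squared-exponential kernel and its amplitude and length-scale parameters are assumptions, not the actual implementation:

```python
import numpy as np

def smooth_curve_perturbations(z, amplitude=0.1, length_scale=0.5,
                               n_curves=5, seed=42):
    """Draw smooth random perturbations around w(z) = -1 by sampling
    a Gaussian process with a squared-exponential covariance kernel."""
    rng = np.random.default_rng(seed)
    # Squared-exponential (RBF) covariance between all redshift pairs
    diff = z[:, None] - z[None, :]
    cov = amplitude**2 * np.exp(-0.5 * (diff / length_scale) ** 2)
    cov += 1e-10 * np.eye(len(z))  # jitter for numerical stability
    mean = -np.ones_like(z)        # cosmological-constant baseline w = -1
    return rng.multivariate_normal(mean, cov, size=n_curves)

# Example: five smooth w(z) curves on a redshift grid out to z = 2
z = np.linspace(0.0, 2.0, 100)
curves = smooth_curve_perturbations(z)
```

The length scale controls how slowly each curve varies with redshift, so it directly sets how "smooth" the injected deviation from a cosmological constant is.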
With regard to large-scale structure, we demonstrate the advantages of density
ridges, estimated as curvilinear principal curves from Dark Energy Survey weak
lensing data, for cosmic trough identification in Chapter 4. Denoising
large-scale structure in this
way allows for the more fine-grained identification of structural components in the
cosmic web. We also compare the results of our extended version of the subspace-constrained mean shift algorithm to curvelet denoising as an alternative method,
as well as to trough structure from measurements of the foreground matter density
field. Lastly, in the area of galaxy formation and evolution, we combine analytic
formalisms and machine learning methods in a hybrid prediction framework in
Chapter 5. We use a two-step process to populate dark matter haloes taken from
the SIMBA cosmological simulation with baryonic galaxy properties of interest.
For this purpose, we use the equilibrium model of galaxy evolution as a precursory
module to enable an improved prediction of remaining baryonic properties as a
way to quickly complete cosmological simulations.
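The two-step hybrid idea above can be illustrated schematically. In this toy sketch, a placeholder power-law relation stands in for the equilibrium model, and an ordinary least-squares fit stands in for the machine learning module; the halo properties, formulas, and coefficients are all invented for illustration and do not come from the thesis or from SIMBA:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy halo catalogue: log halo mass and a secondary halo property
log_mhalo = rng.uniform(11.0, 14.0, size=200)
spin = rng.normal(0.035, 0.01, size=200)

# Step 1: placeholder analytic module -- a simple power-law estimate of
# a star formation rate from halo mass (standing in for the equilibrium model)
log_sfr_analytic = 0.9 * (log_mhalo - 12.0)

# Toy "true" baryonic property to be predicted (with observational scatter)
log_mstar_true = (0.8 * log_mhalo + 0.3 * log_sfr_analytic
                  + rng.normal(0.0, 0.05, size=200) - 1.0)

# Step 2: learned module -- the analytic estimate is appended to the halo
# features, so the regressor predicts the remaining baryonic property from both
X = np.column_stack([log_mhalo, spin, log_sfr_analytic, np.ones_like(spin)])
coeffs, *_ = np.linalg.lstsq(X, log_mstar_true, rcond=None)
log_mstar_pred = X @ coeffs
```

The point of the two-step layout is that the cheap analytic estimate carries physical structure into the feature set, so the learned second stage only has to model the residual relationship.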
Lagged correlation-based deep learning for directional trend change prediction in financial time series
Trend change prediction in complex systems with a large number of noisy time
series is a problem with many applications for real-world phenomena, with stock
markets as a notoriously difficult-to-predict example of such systems. We
approach the prediction of directional trend changes via complex lagged
correlations between series, excluding any information about the target series
from the respective inputs to achieve predictions based purely on such
correlations with other series. We propose the use of deep neural networks that
employ step-wise linear regressions with exponential smoothing in the
preparatory feature engineering for this task, with regression slopes as trend
strength indicators for a given time interval. We apply this method to
historical stock market data from 2011 to 2016 as a use case example of lagged
correlations between large numbers of time series that are heavily influenced
by externally arising new information as a random factor. The results
demonstrate the viability of the proposed approach, with state-of-the-art
accuracies, and we account for the statistical significance of the results as
additional validation; the findings also carry important implications for modern
financial economics.

Comment: 11 pages, 4 figures
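The feature engineering described above, exponential smoothing followed by step-wise linear regressions whose slopes serve as trend strength indicators, can be sketched as follows. The smoothing factor and window length here are illustrative assumptions, not the paper's settings:

```python
import numpy as np

def exponential_smoothing(series, alpha=0.3):
    """Simple exponential smoothing: s_t = alpha*x_t + (1 - alpha)*s_{t-1}."""
    smoothed = np.empty(len(series), dtype=float)
    smoothed[0] = series[0]
    for t in range(1, len(series)):
        smoothed[t] = alpha * series[t] + (1 - alpha) * smoothed[t - 1]
    return smoothed

def trend_slopes(series, window=10, alpha=0.3):
    """Fit a linear regression in each non-overlapping window of the
    smoothed series; each slope is a trend strength indicator for that
    interval, and a sign flip between windows marks a directional
    trend change."""
    smoothed = exponential_smoothing(np.asarray(series, dtype=float), alpha)
    x = np.arange(window)
    slopes = []
    for start in range(0, len(smoothed) - window + 1, window):
        slope, _ = np.polyfit(x, smoothed[start:start + window], deg=1)
        slopes.append(slope)
    return np.array(slopes)

# Example: a series that rises for 10 steps, then falls for 10 steps --
# the slope sign flips between the two windows, i.e. a trend change
series = np.concatenate([np.arange(10.0), np.arange(10.0, 0.0, -1.0)])
slopes = trend_slopes(series, window=10)
```

Labeling each window by the sign of its slope turns the noisy raw series into a sequence of directional trend indicators, which is the form of input the described networks consume.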